
Conversation

@anoto-moniz (Collaborator)

Drop the data manager and v3.0 migration docs, as they shouldn't be needed anymore. Add a FAQ to clearly demonstrate the usage of predictor_evaluators.

PR Type:

  • Breaking change (fix or feature that would cause existing functionality to change)
  • Bug fix (non-breaking change which fixes an issue)
  • New feature (non-breaking change which adds functionality)
  • Maintenance (non-breaking change to assist developers)

Adherence to team decisions

  • I have added tests for 100% coverage
  • I have written NumPy-style docstrings for every method and class
  • I have communicated the downstream consequences of the PR to others.
  • I have bumped the version in __version__.py

@jspeerless (Collaborator) left a comment

Some minor changes - but otherwise LGTM!


evaluators = project.predictor_evaluations.default(predictor_id=predictor.uid)

Even if the predictor hasn't been registered to platform yet:

Suggested change:
- Even if the predictor hasn't been registered to platform yet:
+ You can evaluate your predictor even if it hasn't been registered to platform yet:
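
For context, here is a minimal sketch of the usage under discussion. Only the final `predictor_evaluations.default(...)` call is taken from this diff; the client setup, host, project ID, and predictor ID are illustrative placeholders.

```python
from citrine import Citrine

# Illustrative setup only; not part of the diff under review.
session = Citrine(api_key="your-api-key", host="example.citrine-platform.com")
project = session.projects.get("your-project-uuid")
predictor = project.predictors.get("your-predictor-uuid")

# The call under discussion: fetch the platform's default evaluators for a
# predictor, without registering a Predictor Evaluation Workflow first.
evaluators = project.predictor_evaluations.default(predictor_id=predictor.uid)
```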

Working With Evaluators
=======================

You can also run your predictor against a list of specific evaluators:

@jspeerless (Collaborator) commented on Oct 1, 2025

I think maybe we should just add a note that Evaluators (such as CrossValidationEvaluators) can be configured the same as they always have. The only difference is that they can be executed directly against a predictor to get evaluation results without the need for a registered Predictor Evaluation Workflow.
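
As a rough sketch of that point, assuming the existing `CrossValidationEvaluator` and `RMSE` imports from citrine-python (the configuration below follows the established pattern; the exact call for running a specific list of evaluators isn't quoted in this thread):

```python
from citrine.informatics.predictor_evaluator import CrossValidationEvaluator
from citrine.informatics.predictor_evaluation_metrics import RMSE

# Configuring an evaluator works the same as it always has.
evaluator = CrossValidationEvaluator(
    name="5-fold CV on y",
    description="Cross-validate the 'y' response",
    responses={"y"},
    n_folds=5,
    n_trials=3,
    metrics={RMSE()},
)

# Previously, running it required registering a Predictor Evaluation Workflow:
#     workflow = PredictorEvaluationWorkflow(
#         name='workflow that evaluates y',
#         evaluators=[evaluator]
#     )
# Per this PR, that registration step is no longer needed: evaluators can be
# executed directly against a predictor to get evaluation results.
```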

@anoto-moniz force-pushed the predictor-evaluations-migration branch from 518e98e to 9dd3074 on October 1, 2025 17:00
@anoto-moniz force-pushed the predictor-evaluations-migration branch from 9dd3074 to 5f24de8 on October 1, 2025 17:02
@anoto-moniz requested a review from jspeerless on October 1, 2025 18:54
@anoto-moniz marked this pull request as ready for review on October 1, 2025 18:56
@anoto-moniz requested a review from a team as a code owner on October 1, 2025 18:56

@jspeerless (Collaborator) left a comment

Looks Great!

@kroenlein (Collaborator) left a comment

What's written looks good.

The Example of working with the AI Engine still references

workflow = PredictorEvaluationWorkflow(
    name='workflow that evaluates y',
    evaluators=[evaluator]
)

rather than the new workflow. But maybe that's a different PR.

@anoto-moniz (Collaborator, Author)

> What's written looks good.
>
> The Example of working with the AI Engine still references
>
> workflow = PredictorEvaluationWorkflow(
>     name='workflow that evaluates y',
>     evaluators=[evaluator]
> )
>
> rather than the new workflow. But maybe that's a different PR.

Good catch! I'll knock that out in a quick follow-up, so James can release earlier if he wishes.

@anoto-moniz merged commit 80bcd5b into main on Oct 6, 2025 (45 checks passed)
@anoto-moniz deleted the predictor-evaluations-migration branch on October 6, 2025 13:05

@anoto-moniz (Collaborator, Author) commented on Oct 6, 2025

>> What's written looks good.
>>
>> The Example of working with the AI Engine still references
>>
>> workflow = PredictorEvaluationWorkflow(
>>     name='workflow that evaluates y',
>>     evaluators=[evaluator]
>> )
>>
>> rather than the new workflow. But maybe that's a different PR.
>
> Good catch! I'll knock that out in a quick follow-up, so James can release earlier if he wishes.

Never mind. Those docs are only published on release, and I've already made those (and many more) docs changes in another PR. So we're all set.
